Extended inequalities for weighted Renyi entropy involving generalized Gaussian densities

Author

  • Salimeh Yasaei Sekeh
Abstract

In this paper the author analyzes the weighted Renyi entropy in order to derive several inequalities in the weighted case. Furthermore, using the proposed notions of α-th generalized deviation and (α, p)-th weighted Fisher information, extended versions of the moment-entropy, Fisher information and Cramér-Rao inequalities in terms of generalized Gaussian densities are given.

1 The weighted p-Renyi entropy

In 1960, Renyi was looking for the most general definition of information measures that would preserve additivity for independent events and be compatible with the axioms of probability. He started with Cauchy's functional equation and ended up with the general theory of means. This investigation led to the definition of the Renyi entropy, see [16]. In addition, Stam [19] showed that a continuous random variable with given Fisher information and minimal Shannon entropy must be Gaussian. The moment-entropy inequality, in turn, establishes that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian as well; cf. [10, 25, 14, 6]. Furthermore, the Cramér-Rao inequality shows that the second moment of a continuous random variable is bounded below by the reciprocal of its Fisher information; cf. [7]. Recently, in [13], the notions of relative Renyi entropy and of the (α, p)-th Fisher information, a general form of Fisher information associated with the Renyi entropy, together with the α-th moment and deviation, were introduced. More results and certain applications appeared in [5, 11, 2, 3]. Later, an interesting generalization of Stam's inequality involving the Renyi entropy was given, again in [13], where it is asserted that the generalized Gaussian densities maximize the Renyi entropy for given generalized Fisher information. The initial concept of the weighted entropy, as another generalization of entropy, was proposed in [1, 8, 4].
Certain applications of the weighted entropy have been presented in information theory and computer science (see [17, 18, 21, 24, 22]). Let us now give the definition of the weighted entropy. For a given function x ∈ R ↦ φ(x) ≥ 0 and an RV X : Ω → R with PM/DF f, the weighted entropy (WE) of X (or of f) with weight function (WF) φ is defined by

    h_φ^w(X) = h_φ^w(f) = − ∫_R φ(x) f(x) log f(x) dx = −E_X(φ log f).    (1.1)

2010 Mathematics Subject Classification: 60A10, 60B05, 60C05.
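As a numerical illustration of definition (1.1), the following sketch (assuming Python with NumPy; the standard Gaussian test density and the weight functions φ ≡ 1 and φ(x) = x² are illustrative choices, not taken from the paper) approximates h_φ^w(f) by quadrature. Taking φ ≡ 1 recovers the ordinary Shannon entropy (1/2)·log(2πe), while for φ(x) = x² a short computation with log f(x) = −x²/2 − (1/2)·log(2π) gives the closed form E[X⁴]/2 + (1/2)·log(2π) = 3/2 + (1/2)·log(2π).

```python
import numpy as np

def weighted_entropy(phi, f, x):
    """Approximate h_phi^w(f) = -∫ phi(x) f(x) log f(x) dx, as in (1.1),
    by the trapezoidal rule on the grid x (f must be positive on x)."""
    fx = f(x)
    integrand = phi(x) * fx * np.log(fx)
    return -np.sum((integrand[:-1] + integrand[1:]) * np.diff(x)) / 2.0

# Standard Gaussian density as the test case.
f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
x = np.linspace(-10.0, 10.0, 200001)

# phi ≡ 1: reduces to the Shannon entropy, (1/2) log(2*pi*e).
h_plain = weighted_entropy(lambda x: np.ones_like(x), f, x)

# phi(x) = x^2: closed form 3/2 + (1/2) log(2*pi).
h_x2 = weighted_entropy(lambda x: x**2, f, x)

print(h_plain, 0.5 * np.log(2 * np.pi * np.e))
print(h_x2, 1.5 + 0.5 * np.log(2 * np.pi))
```

The trapezoidal rule is written out by hand so the sketch does not depend on a particular NumPy version; the truncation of the integral to [−10, 10] is harmless here because the Gaussian tails are negligible.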

Similar articles

Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...

On a (\beta,q)-generalized Fisher information and inequalities involving q-Gaussian distributions

In the present paper, we would like to draw attention to a possible generalized Fisher information that fits well in the formalism of nonextensive thermostatistics. This generalized Fisher information is defined for densities on R^n. Just as the maximum Rényi or Tsallis entropy subject to an ellipt...

The Rate of Entropy for Gaussian Processes

In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Renyi entropy rates. Using that, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Renyi, Shannon and Tsallis entropy rates for stationary Gaussian proc...

Draft #1 On a (β, q)-generalized Fisher information and inequalities involving q-Gaussian distributions

In the present paper, we would like to draw attention to a possible generalized Fisher information that fits well in the formalism of nonextensive thermostatistics. This generalized Fisher information is defined for densities on R. Just as the maximum Rényi or Tsallis entropy subject to an elliptic moment constraint is a generalized q-Gaussian, we show that the minimization of the generalized Fisher ...

Strong Convergence of Weighted Sums for Negatively Orthant Dependent Random Variables

We discuss in this paper the strong convergence for weighted sums of negatively orthant dependent (NOD) random variables by generalized Gaussian techniques. As a corollary, a Cesàro law of large numbers for i.i.d. random variables is extended to the NOD setting by generalized Gaussian techniques.


Journal:
  • CoRR

Volume abs/1509.02190  Issue

Pages  -

Publication date 2015